Sparse Kernel Learning and the Relevance Units Machine

Authors

  • Junbin Gao
  • Jun Zhang
Abstract

The relevance vector machine (RVM) is a state-of-the-art technique for constructing sparse kernel regression models [1,2,3,4]. It not only generates a much sparser model but also provides better generalization performance than the standard support vector machine (SVM). In both the RVM and the SVM, the relevance vectors (RVs) and support vectors (SVs) are selected from the input vector set, which may limit model flexibility. In this paper we propose a new sparse kernel model called the Relevance Units Machine (RUM). The RUM follows the idea of the RVM under the Bayesian framework but removes the constraint that the RVs have to be selected from the input vectors: it treats the relevance units as part of the parameters of the model. As a result, the RUM retains all the advantages of the RVM and offers superior sparsity. The new algorithm is demonstrated to possess considerable computational advantages over well-known state-of-the-art algorithms.
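To make the central idea concrete, the following is a minimal, hypothetical sketch (in Python/NumPy, not the authors' implementation) of a kernel regression model in which the kernel centres, the relevance units, are free parameters learned jointly with the weights rather than being chosen from the training inputs as in the RVM and SVM. The function names, the fixed number of units, and the plain gradient-descent objective are assumptions made for illustration; the actual RUM training in the paper is Bayesian.

    import numpy as np

    def gaussian_design(X, U, width):
        # N x M design matrix of Gaussian basis functions centred on the units U.
        d2 = ((X[:, None, :] - U[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * width ** 2))

    def fit_rum_sketch(X, y, n_units=5, width=1.0, lam=1e-3, lr=0.05, n_steps=2000):
        """Jointly learn a small set of free kernel centres ("relevance units")
        and their weights by gradient descent on a ridge-regularised squared error.
        Purely illustrative: the paper's RUM learns the units within a Bayesian model."""
        rng = np.random.default_rng(0)
        n, _ = X.shape
        U = X[rng.choice(n, n_units, replace=False)]  # start units at random inputs
        w = np.zeros(n_units)
        for _ in range(n_steps):
            K = gaussian_design(X, U, width)
            r = K @ w - y                             # residuals, shape (n,)
            grad_w = K.T @ r / n + lam * w
            # Gradient w.r.t. each unit location u_m, via
            # dK[i, m]/du_m = K[i, m] * (x_i - u_m) / width**2.
            grad_U = np.stack([
                (w[m] / (n * width ** 2)) * ((r * K[:, m]) @ (X - U[m]))
                for m in range(n_units)
            ])
            w -= lr * grad_w
            U -= lr * grad_U
        return U, w

On a one-dimensional toy problem (for example X of shape (200, 1) and y = np.sin(X).ravel() plus noise), the learned units typically settle at a handful of informative locations that need not coincide with any training input, which is exactly the extra freedom the RUM exploits.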


Similar resources

Gene Identification from Microarray Data for Diagnosis of Acute Myeloid and Lymphoblastic Leukemia Using a Sparse Gene Selection Method

Background: Microarray experiments can simultaneously determine the expression of thousands of genes. Identification of potential genes from microarray data for diagnosis of cancer is important. This study aimed to identify genes for the diagnosis of acute myeloid and lymphoblastic leukemia using a sparse feature selection method. Materials and Methods: In this descriptive study, the expressio...


Adaptive spherical Gaussian kernel in sparse Bayesian learning framework for nonlinear regression

Kernel-based machine learning techniques have been widely used to tackle problems of function approximation and regression estimation. The relevance vector machine (RVM) gives state-of-the-art performance in sparse regression. As a popular and competent kernel function in machine learning, the conventional Gaussian kernel uses a unified kernel width for each of its basis functions, which implicitly makes a basic...
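As a point of reference for the distinction this abstract draws (assuming that "adaptive" here means giving each basis function its own width rather than one width shared by all of them, which is an assumption on our part), a minimal Python/NumPy sketch of the two design matrices:

    import numpy as np

    def spherical_gaussian_design(X, centres, width):
        # Conventional case: a single kernel width shared by every basis function.
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * width ** 2))

    def adaptive_gaussian_design(X, centres, widths):
        # Adaptive case: widths[m] is a separate kernel width for the m-th basis function.
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

In the adaptive case the per-basis widths become additional hyperparameters to be estimated within the sparse Bayesian framework.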


Sparse kernel learning with LASSO and Bayesian inference algorithm

Kernelized LASSO (Least Absolute Shrinkage and Selection Operator) has been investigated in two separate recent papers [Gao, J., Antolovich, M., & Kwan, P. H. (2008). L1 LASSO and its Bayesian inference. In W. Wobcke, & M. Zhang (Eds.), Lecture notes in computer science: Vol. 5360 (pp. 318-324); Wang, G., Yeung, D. Y., & Lochovsky, F. (2007). The kernel path in kernelized LASSO. In Internationa...


Multi-objective optimisation of relevance vector machines: selecting sparse features for face verification (ICML/UAI/COLT 2008 Workshop on Sparse Optimisation and Variable Selection)

The relevance vector machine (RVM) (Tipping, 2001) encapsulates a sparse probabilistic model for machine learning tasks. As with support vector machines, use of the kernel trick allows modelling in high-dimensional feature spaces to be achieved at low computational cost. However, sparsity is controlled not just by the automatic relevance determination (ARD) prior but also by the choice of basis fu...
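For reference, the ARD prior mentioned here is, in Tipping's formulation, a zero-mean Gaussian prior with an individual precision hyperparameter per weight; weights whose precisions are driven towards infinity during evidence maximisation are pruned, which is what yields sparsity:

    p(\mathbf{w} \mid \boldsymbol{\alpha}) = \prod_{i} \mathcal{N}\left(w_i \mid 0, \alpha_i^{-1}\right)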


Bacterial Foraging Optimization Combined with Relevance Vector Machine with an Improved Kernel for Pressure Fluctuation of Hydroelectric Units

The optimization of kernel parameters is an important step in applying the Relevance Vector Machine (RVM) to many real-world problems. In this paper, we first develop an improved anisotropic Gaussian kernel as the kernel function of the RVM model, whose parameters are optimized by Bacterial Foraging Optimization (BFO). The proposed method is then applied to describing the pr...
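For orientation (this is the standard form, not necessarily the exact improved kernel proposed in the paper), an anisotropic Gaussian kernel assigns a separate width \theta_d to each input dimension, and it is such per-dimension parameters that an optimiser like BFO would tune:

    k(\mathbf{x}, \mathbf{x}') = \exp\left(-\sum_{d=1}^{D} \frac{(x_d - x'_d)^2}{2\,\theta_d^{2}}\right)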



Publication date: 2009